Curvature Regularization to Prevent Distortion in Graph Embedding

Neural Information Processing Systems

Recent research on graph embedding has achieved success in various applications. Most graph embedding methods preserve the proximity in a graph into a manifold in an embedding space. We identify an important but neglected problem with this proximity-preserving strategy: graph topology patterns, while preserved well into an embedding manifold by preserving proximity, may become distorted in the ambient Euclidean embedding space, making them difficult for machine learning models to detect. To address this problem, we propose curvature regularization, which enforces flatness of embedding manifolds and thereby prevents the distortion. We present a novel angle-based sectional curvature, termed ABS curvature, and, accordingly, three kinds of curvature regularization that induce flat embedding manifolds during graph embedding. We integrate curvature regularization into five popular proximity-preserving embedding methods, and empirical results in two applications show significant improvements on a wide range of open graph datasets.
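The idea of combining a proximity-preserving objective with a flatness-inducing regularizer can be illustrated with a toy sketch. The penalty below is an assumption for illustration only, not the paper's ABS curvature: it compares the embedded length of a short graph path with the straight-line (chord) distance between its endpoints, so an unbent embedding incurs zero penalty while a curved one is penalized.

```python
import numpy as np

def proximity_loss(X, edges):
    # First-order proximity: pull the embedded endpoints of each edge together.
    return sum(np.sum((X[i] - X[j]) ** 2) for i, j in edges)

def flatness_penalty(X, paths):
    # Toy curvature-style penalty (illustrative, NOT the paper's ABS curvature):
    # along each short graph path, compare the embedded path length with the
    # chord distance between its endpoints. On a flat, unbent embedding the
    # two coincide; bending makes the path strictly longer than the chord.
    total = 0.0
    for path in paths:
        hops = sum(np.linalg.norm(X[a] - X[b]) for a, b in zip(path, path[1:]))
        chord = np.linalg.norm(X[path[0]] - X[path[-1]])
        total += (hops - chord) ** 2
    return total

# Toy 4-node path graph 0-1-2-3 embedded in the plane.
edges = [(0, 1), (1, 2), (2, 3)]
paths = [[0, 1, 2, 3]]

straight = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
bent = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])

lam = 0.1  # regularization weight trading off proximity vs. flatness
for X in (straight, bent):
    obj = proximity_loss(X, edges) + lam * flatness_penalty(X, paths)
    print(round(obj, 4))  # straight: 3.0, bent: 3.4
```

Both embeddings preserve first-order proximity equally well (every edge has unit length), but only the penalty term distinguishes the bent manifold, which is the role curvature regularization plays alongside the proximity loss.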


Review for NeurIPS paper: Curvature Regularization to Prevent Distortion in Graph Embedding

Neural Information Processing Systems

Additional Feedback:

* One philosophical question that comes to mind when reading the three observations in the Introduction and the example in Figure 1: could it be that representation methods learn interesting patterns precisely because they are allowed to twist and curve the space as required, bringing nodes that are far apart in graph distance close together in Euclidean distance? It is not obvious to me that depriving the optimization algorithm of this ability can only have a positive outcome. There is a parallel to be drawn with the kernel trick in Support Vector Machines, where we are allowed to embed the data in a higher dimension in which the classes become linearly separable.
* This way, the sum could be defined over (q', q'') \in \Gamma_{i, j}
* The sectional curvature could have been introduced more thoroughly. Unfortunately, for a paper that is heavily based on geometric notions, there is a clear shortage of pictures.

